<?xml version="1.0" encoding="ISO-8859-1"?>
<metadatalist>
	<metadata ReferenceType="Conference Proceedings">
		<site>sibgrapi.sid.inpe.br 802</site>
		<holdercode>{ibi 8JMKD3MGPEW34M/46T9EHH}</holdercode>
		<identifier>6qtX3pFwXQZG2LgkFdY/LMmuu</identifier>
		<repository>sid.inpe.br/sibgrapi@80/2006/07.17.10.15</repository>
		<lastupdate>2006:07.17.10.15.54 sid.inpe.br/banon/2001/03.30.15.38 administrator</lastupdate>
		<metadatarepository>sid.inpe.br/sibgrapi@80/2006/07.17.10.15.55</metadatarepository>
		<metadatalastupdate>2022:06.14.00.13.13 sid.inpe.br/banon/2001/03.30.15.38 administrator {D 2006}</metadatalastupdate>
		<doi>10.1109/SIBGRAPI.2006.13</doi>
		<citationkey>CamposMayoMurr:2006:DiAtWe</citationkey>
		<title>Directing the attention of a wearable camera by pointing gestures</title>
		<format>On-line</format>
		<year>2006</year>
		<numberoffiles>1</numberoffiles>
		<size>334 KiB</size>
		<author>de Campos, Teofilo,</author>
		<author>Mayol-Cuevas, Walterio W.,</author>
		<author>Murray, David W.,</author>
		<affiliation>Department of Engineering Science, University of Oxford</affiliation>
		<affiliation>Department of Computer Science, University of Bristol</affiliation>
		<affiliation>Department of Engineering Science, University of Oxford</affiliation>
		<editor>Oliveira Neto, Manuel Menezes de,</editor>
		<editor>Carceroni, Rodrigo Lima,</editor>
		<e-mailaddress>teo@robots.ox.ac.uk</e-mailaddress>
		<conferencename>Brazilian Symposium on Computer Graphics and Image Processing, 19 (SIBGRAPI)</conferencename>
		<conferencelocation>Manaus, AM, Brazil</conferencelocation>
		<date>8-11 Oct. 2006</date>
		<publisher>IEEE Computer Society</publisher>
		<publisheraddress>Los Alamitos</publisheraddress>
		<booktitle>Proceedings</booktitle>
		<tertiarytype>Full Paper</tertiarytype>
		<transferableflag>1</transferableflag>
		<versiontype>finaldraft</versiontype>
		<keywords>3D hand tracking, hand detection, wearable robots</keywords>
		<abstract>Wearable visual sensors provide views of the environment that are rich in information about the wearer's location, interactions and intentions. In the wearable domain, hand gesture recognition is the natural replacement for keyboard input. We describe a framework combining a coarse-to-fine method for shape detection with a 3D tracking method that can identify pointing gestures and estimate their direction. The low computational complexity of both methods allows a real-time implementation, which is applied to estimate the user's focus of attention and to control fast redirections of gaze of a wearable active camera. Experiments demonstrate the robustness of the system on long and noisy image sequences.</abstract>
		<language>en</language>
		<targetfile>deCamposT-Gestures.pdf</targetfile>
		<usergroup>teo.decampos administrator</usergroup>
		<visibility>shown</visibility>
		<nexthigherunit>8JMKD3MGPEW34M/46RFT7E</nexthigherunit>
		<nexthigherunit>8JMKD3MGPEW34M/4742MCS</nexthigherunit>
		<citingitemlist>sid.inpe.br/sibgrapi/2022/05.08.00.20 5</citingitemlist>
		<hostcollection>sid.inpe.br/banon/2001/03.30.15.38</hostcollection>
		<lasthostcollection>sid.inpe.br/banon/2001/03.30.15.38</lasthostcollection>
		<url>http://sibgrapi.sid.inpe.br/rep-/sid.inpe.br/sibgrapi@80/2006/07.17.10.15</url>
	</metadata>
</metadatalist>